Feature Space Interpretation of SVMs with Non-Positive Definite Kernels
Abstract
The widespread practice of “plugging” arbitrary symmetric functions as kernels into support vector machines (SVMs) often yields good empirical classification results. However, in the case of functions that are not conditionally positive definite (non-cpd), these classifiers are hard to interpret because the geometrical and theoretical understanding is missing. In this paper we take a step towards understanding SVM classifiers in these situations. We give a geometric interpretation of SVMs with non-cpd kernel functions: such SVMs are optimal hyperplane classifiers not by margin maximization, but by minimization of the distance between convex hulls in pseudo-Euclidean spaces. This interpretation is the basis for further analysis, e.g., investigating uniqueness or characterizing situations in which SVMs with non-cpd kernels are or are not suitable.
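The pseudo-Euclidean picture can be made concrete: eigendecomposing an indefinite Gram matrix splits it into a positive and a negative part, and the data embed into a space whose inner product negates the coordinates of the negative block. Below is a minimal sketch, assuming an illustrative tanh similarity on synthetic data; it is not the paper's code, and the parameter values are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))

# A symmetric but generally non-(c)pd similarity: the sigmoid/tanh
# "kernel", a classic example of a function plugged into SVMs anyway.
# (Parameter values chosen only for illustration.)
K = np.tanh(0.5 * (X @ X.T) - 1.0)

# Negative eigenvalues of the Gram matrix reveal indefiniteness.
eigvals, eigvecs = np.linalg.eigh(K)
n_pos = int((eigvals > 1e-10).sum())
n_neg = int((eigvals < -1e-10).sum())
print(f"signature: {n_pos} positive, {n_neg} negative eigenvalues")

# Pseudo-Euclidean embedding: scale eigenvectors by sqrt(|eigenvalue|);
# the indefinite "inner product" negates the negative-eigenvalue block.
Phi = eigvecs * np.sqrt(np.abs(eigvals))
signs = np.sign(eigvals)

# The indefinite kernel is recovered exactly as an indefinite inner
# product in this space: K = Phi diag(signs) Phi^T.
K_back = (Phi * signs) @ Phi.T
print("max reconstruction error:", np.abs(K - K_back).max())
```

In this embedding the SVM solution described above corresponds to minimizing the distance between the convex hulls of the two classes rather than maximizing a margin.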
Similar resources
Determining Optimal Support Vector Machines for the Classification of Hyperspectral Images Based on a Genetic Algorithm
Hyperspectral remote sensing imagery, thanks to its rich spectral information, provides an efficient tool for ground classification in complex geographical areas with similar classes. Owing to the robustness of Support Vector Machines (SVMs) in high-dimensional spaces, they are an efficient tool for classifying hyperspectral imagery. However, there are two optimization issues which s...
Full text
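The genetic-algorithm angle this abstract points to can be sketched as evolving a small population of (C, γ) pairs scored by cross-validated accuracy. The toy below assumes scikit-learn's SVC, a synthetic dataset, and made-up GA operators and search ranges; it is not the method of the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

def fitness(ind):
    # ind = (log10 C, log10 gamma); fitness = cross-validated accuracy.
    clf = SVC(C=10.0 ** ind[0], gamma=10.0 ** ind[1])
    return cross_val_score(clf, X, y, cv=3).mean()

# Initialize a small population uniformly over a log-scale search box.
pop = rng.uniform(low=[-2, -4], high=[3, 1], size=(12, 2))
for generation in range(10):
    scores = np.array([fitness(ind) for ind in pop])
    # Selection: keep the best half as parents.
    parents = pop[np.argsort(scores)[-6:]]
    # Crossover: average random parent pairs; mutation: Gaussian noise.
    pairs = rng.choice(6, size=(6, 2))
    children = parents[pairs].mean(axis=1) + rng.normal(scale=0.3, size=(6, 2))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best (C, gamma):", 10.0 ** best)
```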
Subspace Learning in Krein Spaces: Complete Kernel Fisher Discriminant Analysis with Indefinite Kernels
Positive definite kernels, such as Gaussian radial basis functions (GRBF), have been widely used in computer vision for designing feature extraction and classification algorithms. In many cases, non-positive definite (npd) kernels and non-metric similarity/dissimilarity measures arise naturally (e.g., the Hausdorff distance, Kullback-Leibler divergences, and compact support (CS) kernels). Hence, there...
Full text
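As a quick illustration of how npd similarities arise in practice, the sketch below builds a Gram matrix from a similarity derived from the symmetrized Kullback-Leibler divergence between random histograms and inspects its spectrum. The exp(−D/2) construction and all parameters are assumptions made for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
# Random discrete probability distributions (e.g., normalized histograms).
P = rng.dirichlet(np.ones(10), size=30)

def sym_kl(p, q):
    # Symmetrized Kullback-Leibler divergence between two histograms.
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Turn the (non-metric) divergence into a similarity matrix. Such
# constructions are generally not (conditionally) positive definite.
D = np.array([[sym_kl(p, q) for q in P] for p in P])
K = np.exp(-0.5 * D)

# Negative eigenvalues in the spectrum expose the indefiniteness.
print("smallest Gram eigenvalues:", np.sort(np.linalg.eigvalsh(K))[:3])
```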
Coulomb Classifiers: Generalizing Support Vector Machines via an Analogy to Electrostatic Systems
We introduce a family of classifiers based on a physical analogy to an electrostatic system of charged conductors. The family, called Coulomb classifiers, includes the two best-known support vector machines (SVMs), the ν-SVM and the C-SVM. In the electrostatics analogy, a training example corresponds to a charged conductor at a given location in space, and the classification function corresponds to...
Full text
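For readers unfamiliar with the two family members named in this abstract, the ν-SVM and the C-SVM differ only in how the regularization is parameterized. A minimal scikit-learn comparison follows; the synthetic data and parameter values are assumptions for illustration.

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC, NuSVC

X, y = make_blobs(n_samples=100, centers=2, random_state=0)

# C-SVM: C trades margin width against slack penalties.
c_svm = SVC(kernel="rbf", C=1.0).fit(X, y)

# nu-SVM: nu upper-bounds the fraction of margin errors and
# lower-bounds the fraction of support vectors.
nu_svm = NuSVC(kernel="rbf", nu=0.1).fit(X, y)

print("C-SVM  support vectors:", c_svm.support_.size)
print("nu-SVM support vectors:", nu_svm.support_.size)
```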
Discovering Domain-Specific Composite Kernels
Kernel-based data mining algorithms, such as support vector machines, project data into high-dimensional feature spaces, wherein linear decision surfaces correspond to non-linear decision surfaces in the original feature space. Choosing a kernel amounts to choosing a high-dimensional feature space and is thus a crucial step in the data mining process. Despite this fact, and as a result of the ...
Full text
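The composite-kernel idea rests on closure properties of positive definite kernels: nonnegative weighted sums and elementwise products of pd kernels are again pd. The small sketch below checks this numerically; the kernel choices and weights are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))

# Closure properties make composite kernels well defined: nonnegative
# weighted sums and elementwise products of pd kernels stay pd.
K_sum = 0.7 * rbf_kernel(X, gamma=0.5) + 0.3 * polynomial_kernel(X, degree=3)
K_prod = rbf_kernel(X, gamma=0.5) * polynomial_kernel(X, degree=3)

for name, G in [("weighted sum", K_sum), ("elementwise product", K_prod)]:
    # Minimum eigenvalue should be >= 0 up to numerical noise.
    print(name, "min eigenvalue:", np.linalg.eigvalsh(G).min())
```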
Random Feature Maps for Dot Product Kernels Supplementary Material
This document contains detailed proofs of theorems stated in the main article entitled Random Feature Maps for Dot Product Kernels.
1 Proof of Theorem 1
We first recollect Schoenberg's result in its original form.
Theorem 1 (Schoenberg (1942), Theorem 2). A function f : [−1, 1] → R constitutes a positive definite kernel K : S∞ × S∞ → R, K : (x, y) ↦ f(⟨x, y⟩) iff f is an analytic function admittin...
Full text
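The quoted theorem is easy to probe numerically: on unit-sphere data, a function of the inner product with nonnegative Maclaurin coefficients (e.g., exp) yields a positive semidefinite Gram matrix, while one with a negative coefficient (e.g., f(t) = −t) does not. The example functions below are mine, chosen for illustration; they are not from the supplement.

```python
import numpy as np

rng = np.random.default_rng(0)
# Schoenberg's theorem concerns points on the unit sphere, where all
# pairwise inner products lie in [-1, 1].
X = rng.normal(size=(30, 6))
X /= np.linalg.norm(X, axis=1, keepdims=True)
G = X @ X.T

# f(t) = exp(t): Maclaurin coefficients 1/n! are all nonnegative,
# so f(<x, y>) is a positive definite kernel.
K_exp = np.exp(G)

# f(t) = -t: negative first coefficient, hence not positive definite.
K_neg = -G

print("exp(t) kernel, min eigenvalue:", np.linalg.eigvalsh(K_exp).min())
print("-t     kernel, min eigenvalue:", np.linalg.eigvalsh(K_neg).min())
```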